
    Distributed and parallel sparse convex optimization for radio interferometry with PURIFY

    Next generation radio interferometric telescopes are entering an era of big data with extremely large data sets. While these telescopes can observe the sky with higher sensitivity and resolution than before, computational challenges in image reconstruction must be overcome to realize the potential of forthcoming instruments. New methods in sparse image reconstruction and convex optimization (cf. compressive sensing) have been shown to produce higher fidelity reconstructions of simulations and real observations than traditional methods. This article presents distributed and parallel algorithms and implementations for sparse image reconstruction, with significant practical considerations that are important for applying these algorithms to big data. We benchmark the algorithms presented, showing that they are considerably faster than their serial equivalents. We then pre-sample gridding kernels to scale the distributed algorithms to larger data sizes, showing application times for 1 GB to 2.4 TB data sets over 25 to 100 nodes for up to 50 billion visibilities, and find that the run-times for the distributed algorithms range from 100 milliseconds to 3 minutes per iteration. This work presents an important step towards the computationally scalable and efficient algorithms and implementations needed to image observations of both extended and compact sources from next generation radio interferometers such as the SKA. The algorithms are implemented in the latest versions of the SOPT (https://github.com/astro-informatics/sopt) and PURIFY (https://github.com/astro-informatics/purify) software packages (version 3.1.0), which have been released alongside this article. (25 pages, 5 figures.)
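The sparse convex optimization at the heart of this kind of image reconstruction can be illustrated with a toy example. The sketch below is not PURIFY's actual distributed algorithm; it is a minimal serial ISTA (iterative soft-thresholding) solver for the generic l1-regularised inverse problem that compressive-sensing imaging builds on, with a random measurement matrix standing in for the interferometric measurement operator.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1 norm: shrink coefficients toward zero.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(y, Phi, lam, n_iter):
    # Minimise 0.5*||y - Phi x||^2 + lam*||x||_1 by iterative soft-thresholding.
    x = np.zeros(Phi.shape[1])
    L = np.linalg.norm(Phi, 2) ** 2          # Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ x - y)         # gradient of the data-fidelity term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy problem: recover a 3-sparse signal from 40 random measurements.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[[5, 30, 70]] = [2.0, -1.5, 1.0]
x_hat = ista(Phi @ x_true, Phi, lam=0.01, n_iter=500)
```

In practice the distributed versions split the measurement operator and visibilities across nodes, but the proximal structure of the iteration is the same.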

    Variations of the Greenberg Unrelated Question binary model

    We explore different variations of the Greenberg Unrelated Question RRT model for a binary response (yes or no). In one variation, we allow m independent responses from each respondent. In another, we use inverse sampling, recording the number of responses leading up to the kth "yes" response. For m > 1, the variance (theoretical and empirical) of the multiple independent response model decreases significantly relative to the regular Greenberg et al. (1969) model (m = 1). Similarly, for k > 1, the variance (theoretical and empirical) of the inverse sampling model decreases significantly relative to the inverse sampling model for k = 1. Both variations therefore produce more efficient models. These results have been validated by theoretical comparisons, extensive computer simulations, and a field survey.
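The efficiency gain from multiple independent responses can be checked with a small Monte Carlo sketch. The parameter values below (sensitive prevalence 0.3, unrelated-question prevalence 0.5, design probability 0.7) are illustrative assumptions, not the paper's settings; the estimator is the standard unrelated-question moment estimator.

```python
import numpy as np

def greenberg_estimate(n, m, pi_s=0.3, pi_u=0.5, p=0.7, rng=None):
    # Each of n respondents gives m independent randomized responses:
    # with probability p the sensitive question is answered, otherwise
    # the unrelated question (known prevalence pi_u).
    if rng is None:
        rng = np.random.default_rng()
    truth = rng.random(n) < pi_s                 # hidden sensitive trait
    asks_sensitive = rng.random((n, m)) < p
    unrelated_yes = rng.random((n, m)) < pi_u
    yes = np.where(asks_sensitive, truth[:, None], unrelated_yes)
    lam_hat = yes.mean()                         # observed "yes" proportion
    return (lam_hat - (1 - p) * pi_u) / p        # moment estimator of pi_s

rng = np.random.default_rng(1)
est_m1 = [greenberg_estimate(500, 1, rng=rng) for _ in range(400)]
est_m3 = [greenberg_estimate(500, 3, rng=rng) for _ in range(400)]
```

Repeating the survey simulation shows both estimators are unbiased while the m = 3 variant has visibly smaller empirical variance, consistent with the abstract's claim.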

    TM plasmonic modes in a multilayer graphene-dielectric structure

    The optical and electronic properties of multilayer systems have been extensively studied in recent years owing to their potential applications in high-performance optoelectronic and photonic devices. In particular, plasmonic modes play a critical role in such systems, leading to improvements in solar cell efficiency, biosensor detection, and Raman signal enhancement, among others. In this work, we study the plasmonic modes of a multilayer system composed of graphene layers embedded within dielectric materials. The dispersion relation of the plasmonic modes is obtained by calculating the poles of the reflectivity using the transfer matrix method. We show the attenuated total reflection spectra for a multilayer graphene-dielectric structure and determine the optimum distance between the prism and the multilayer system for detecting graphene plasmons in the Otto configuration. In addition to the well-known plasmonic bands, when we consider both the interband and intraband contributions to graphene's conductivity and large wavevectors parallel to the graphene plane, all plasmonic bands show asymptotic behavior and an additional upper mode emerges. Finally, the number of branches in the plasmonic dispersion relation depends on the number of graphene sheets.
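For intuition, the textbook long-wavelength TM plasmon dispersion of a single graphene sheet between two dielectrics (intraband Drude limit) can be evaluated directly; it shows the characteristic omega proportional to sqrt(q) scaling. This single-sheet formula is a standard result, not the paper's multilayer transfer-matrix calculation, and the Fermi level and permittivities below are illustrative assumptions.

```python
import numpy as np

# SI constants
E = 1.602176634e-19       # elementary charge (C)
HBAR = 1.054571817e-34    # reduced Planck constant (J s)
EPS0 = 8.8541878128e-12   # vacuum permittivity (F/m)

def graphene_plasmon_omega(q, e_fermi_ev=0.4, eps_avg=2.5):
    # Long-wavelength TM plasmon of a single graphene sheet between two
    # dielectrics of average permittivity eps_avg, intraband Drude limit:
    # omega(q) = sqrt(e^2 * E_F * q / (2*pi*eps0*eps_avg*hbar^2)).
    e_f = e_fermi_ev * E
    return np.sqrt(E**2 * e_f * q / (2 * np.pi * EPS0 * eps_avg * HBAR**2))

# The dispersion scales as sqrt(q): quadrupling q doubles omega.
omega_mid_ir = graphene_plasmon_omega(1e8)  # rad/s at q = 1e8 m^-1
```

The multilayer case replaces this closed form with poles of the reflectivity from the transfer matrices, which is what produces the multiple branches the abstract describes.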

    A global assessment of the potential distribution of naturalized and planted populations of the ornamental alien tree Schinus molle

    A study of the potential invasion of the Peruvian peppertree, a naturalized plant native to southern South America. The Peruvian Peppertree (Schinus molle L.) is an evergreen tree native to semiarid environments of Peru and Bolivia in South America. This tree has been introduced and widely planted for ornamental and forestry purposes in several semiarid regions of the world because its seedlings are easily established and have a high survival rate; it also grows quickly and tolerates dry climates. We compared the global and regional niches of naturalized and planted populations of S. molle in order to examine the invasion stages and potential distribution of this species in four regions of the world. This work provides a novel approach for understanding the invasion dynamics of S. molle in these areas and elucidates the ecological processes that bring about such invasions. Most naturalized and planted populations were found to be in equilibrium with the environment. In its native range, as well as in Australia and South Africa, the modelled habitat suitability for natural populations was highest, whereas the coverage area of planted populations was lower. For planted populations in Australia and South Africa, a large percentage of predicted presences fell within sink populations. The invasion stages of S. molle vary across regions in its adventive range; this result may be attributable to residence time as well as to climatic and anthropic factors that have contributed to the spread of populations. Funding: SEMARNAT-CONACYT [Grant FSSEMARNAT01-C-2018-1-A3-S-80837].

    EXTENDING THE SHEBA PROPAGATION MODEL TO REDUCE PARAMETER-RELATED UNCERTAINTIES

    Heliophysics is the branch of physics that investigates the interactions and correlation of different events across the Solar System. The mathematical models that describe and predict how physical events move across the solar system (i.e., Propagation Models) are of great relevance. These models depend on parameters that users must set, hence the ability to correctly set the values is key to reliable simulations. Traditionally, parameter values can be inferred from data either at the source (the Sun) or arrival point (the target), or can be extrapolated from common knowledge of the event under investigation. Another way of setting parameters for Propagation Models is proposed here: instead of guessing a priori parameters from scientific data or common knowledge, the model is executed as a parameter-sweep job and selects a posteriori the parameters that yield results most compatible with the event data. In either case (a priori and a posteriori), the correct use of Propagation Models requires information to either select the parameters, validate the results, or both. In order to do so, it is necessary to access sources of information. For this task, the HELIO project proves very effective as it offers the most comprehensive integrated information system in this domain and provides access and coordination to services to mine and analyze data. HELIO also provides a Propagation Model called SHEBA, the extension of which is currently being developed within the SCI-BUS project (a coordinated effort for the development of a framework capable of offering to science gateways seamless access to major computing and data infrastructures).
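The a posteriori strategy can be sketched in a few lines: run the model over a grid of parameter values and keep the value whose prediction best matches the observed event data. The ballistic constant-speed model and the speed grid below are illustrative stand-ins for SHEBA's actual physics and parameter space.

```python
import numpy as np

AU_KM = 1.496e8  # mean Sun-Earth distance in km

def predicted_arrival_h(speed_km_s):
    # Ballistic propagation: constant radial speed from Sun to Earth.
    return AU_KM / speed_km_s / 3600.0

def best_fit_speed(observed_arrival_h, speeds=np.arange(300, 2001, 10)):
    # Parameter-sweep job: run the model for every candidate speed and
    # select a posteriori the one most compatible with the observation.
    errors = np.abs(predicted_arrival_h(speeds) - observed_arrival_h)
    return int(speeds[np.argmin(errors)])

best = best_fit_speed(observed_arrival_h=52.0)  # km/s
```

An observed 52-hour transit selects a speed of about 800 km/s from this grid; a real sweep would compare against richer event data than a single arrival time.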

    A WORKFLOW-ORIENTED APPROACH TO PROPAGATION MODELS IN HELIOPHYSICS

    The Sun is responsible for the eruption of billions of tons of plasma and the generation of near light-speed particles that propagate throughout the solar system and beyond. If directed towards Earth, these events can be damaging to our technological infrastructure. Hence there is an effort to understand the cause of the eruptive events and how they propagate from Sun to Earth. However, the physics governing their propagation is not well understood, so there is a need to develop a theoretical description of their propagation, known as a Propagation Model, in order to predict when they may impact Earth. It is often difficult to define a single propagation model that correctly describes the physics of solar eruptive events, and even more difficult to implement models capable of catering for all these complexities and to validate them using real observational data. In this paper, we envisage that workflows offer both a theoretical and practical framework for a novel approach to propagation models. We define a mathematical framework that aims at encompassing the different modalities with which workflows can be used, and provide a set of generic building blocks written in the TAVERNA workflow language that users can use to build their own propagation models. Finally, we test both the theoretical model and the composite building blocks of the workflow with a real Science Use Case that was discussed during the 4th CDAW (Coordinated Data Analysis Workshop) event held by the HELIO project. We show that generic workflow building blocks can be used to construct a propagation model that successfully describes the transit of solar eruptive events toward Earth and predicts a correct Earth-impact time.
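The building-block idea can be sketched as plain function composition: each block consumes and enriches an event record, and a workflow is an ordered list of blocks. The blocks below (a hard-coded acquisition step, a ballistic propagation step, and a plausibility check) are hypothetical illustrations, not the TAVERNA components described in the paper.

```python
from functools import reduce

def acquire_event(_):
    # Stand-in for a data-acquisition block querying an event catalogue.
    return {"speed_km_s": 900.0, "onset_h": 0.0}

def propagate(event):
    # Stand-in for a propagation-model block (ballistic Sun-to-Earth transit).
    event["arrival_h"] = event["onset_h"] + 1.496e8 / event["speed_km_s"] / 3600.0
    return event

def validate(event):
    # Stand-in for a validation block: typical CME transits take roughly 1-5 days.
    event["plausible"] = 14.0 <= event["arrival_h"] <= 120.0
    return event

def run_workflow(blocks, seed=None):
    # A workflow is an ordered list of blocks, applied left to right.
    return reduce(lambda data, block: block(data), blocks, seed)

result = run_workflow([acquire_event, propagate, validate])
```

Swapping in a different propagation block changes the model without touching acquisition or validation, which is the composability argument the paper makes for workflows.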

    Accuracy and Precision of the COSMED K5 Portable Analyser

    The main aims of this study were to determine the accuracy of the portable metabolic cart K5 by comparison with a stationary metabolic cart (Vyntus CPX), to check the validity of the Vyntus CPX using a butane combustion test, and to assess the reliability of the K5 during prolonged walks in the field. For validation, measurements were performed consecutively with both devices at rest and during submaximal exercise (bicycling) at low (60 W) and moderate intensities (130–160 W) in 16 volunteers. For the reliability study, 14 subjects were measured twice during prolonged walks (13 km, at 5 km/h), with the K5 set in mixing chamber (Mix) mode. The Vyntus measured the stoichiometric RQ of butane combustion with high accuracy (error <1.6%) and precision (CV <0.5%) at VO2 values between 0.788 and 6.395 L/min. At rest and 60 W, there was good agreement between the Vyntus and the K5 (breath-by-breath, B×B) in VO2, VCO2, RER, and energy expenditure, while in Mix mode the K5 overestimated VO2 by 13.4 and 5.8%, respectively. Compared to the Vyntus at moderate intensity, the K5 in B×B mode underestimated VO2, VCO2, and energy expenditure by 6.6, 6.9, and 6.6%, respectively. However, at this intensity there was excellent agreement between methods in RER and fat oxidation. In Mix mode, the K5 overestimated VO2 by 5.8 and 4.8% at 60 W and the higher intensity, respectively. The K5 had excellent reliability during the field tests. Total energy expenditure per km was determined with a CV for repeated measurements of 4.5% (CI: 3.2–6.9%) and a concordance correlation coefficient of 0.91, similar to the variability in VO2. This high reproducibility was explained by the low variation of FEO2 measurements, which had a CV of 0.9% (CI: 0.7–1.5%), combined with a slightly greater variability of FECO2, VE, VCO2, and RER.
In conclusion, the K5 is an excellent portable metabolic cart that is almost as accurate as a state-of-the-art stationary metabolic cart, capable of precisely measuring energy expenditure in the field and showing reliable performance during more than 2 h of continuous work. At high intensities, the mixing-chamber mode is more accurate than the B×B mode.
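The quantities compared in this study come from a few standard indirect-calorimetry formulas; a minimal sketch is below (the abbreviated Weir equation for energy expenditure, the respiratory exchange ratio, and the coefficient of variation used in reliability analyses). These are textbook definitions, not formulas taken from this paper's methods section.

```python
import math

def weir_energy_expenditure(vo2_l_min, vco2_l_min):
    # Abbreviated Weir equation: energy expenditure (kcal/min) from
    # oxygen uptake and carbon dioxide production (both in L/min).
    return 3.941 * vo2_l_min + 1.106 * vco2_l_min

def rer(vo2_l_min, vco2_l_min):
    # Respiratory exchange ratio: ~0.7 burning fat, ~1.0 burning carbohydrate.
    return vco2_l_min / vo2_l_min

def cv_percent(repeated):
    # Coefficient of variation (%) for repeated measurements: sample SD / mean.
    n = len(repeated)
    mean = sum(repeated) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in repeated) / (n - 1))
    return sd / mean * 100.0
```

Because energy expenditure is a weighted sum of VO2 and VCO2, a systematic bias in VO2 (as reported for the Mix and B×B modes) propagates almost proportionally into the energy estimate.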

    Diffuse interstitial fibrosis assessed by cardiac magnetic resonance is associated with dispersion of ventricular repolarization in patients with hypertrophic cardiomyopathy

    Background Hypertrophic cardiomyopathy (HCM) is characterized by myocyte hypertrophy, disarray, fibrosis, and increased risk for ventricular arrhythmias. Increased QT dispersion has been reported in patients with HCM, but the underlying mechanisms have not been completely elucidated. In this study, we examined the relationship between diffuse interstitial fibrosis, replacement fibrosis, QTc dispersion, and ventricular arrhythmias in patients with HCM. We hypothesized that fibrosis would slow impulse propagation and increase dispersion of ventricular repolarization, resulting in increased QTc dispersion on the surface electrocardiogram (ECG) and ventricular arrhythmias. Methods ECG and cardiac magnetic resonance (CMR) image analyses were performed retrospectively in 112 patients with a clinical diagnosis of HCM. Replacement fibrosis was assessed by measuring late gadolinium (Gd) enhancement (LGE), using a semi-automated threshold technique. Diffuse interstitial fibrosis was assessed by measuring T1 relaxation times after Gd administration, using the Look-Locker sequence. QTc dispersion was measured digitally in the septal/anterior (V1-V4), inferior (II, III, and aVF), and lateral (I, aVL, V5, and V6) lead groups on the surface ECG. Results All patients had evidence of asymmetric septal hypertrophy. LGE was evident in 70 (63%) patients; the median T1 relaxation time was 411±38 ms. An inverse correlation was observed between T1 relaxation time and QTc dispersion in leads V1-V4 (p < 0.001). Patients with HCM who developed sustained ventricular tachycardia had a slightly higher probability of increased QTc dispersion in leads V1-V4 (odds ratio, 1.011 [1.004-1.017], p = 0.003). We found no correlation between the presence and percentage of LGE and QTc dispersion. Conclusion Diffuse interstitial fibrosis is associated with increased dispersion of ventricular repolarization in leads V1-V4, reflecting electrical activity in the hypertrophied septum.
Interstitial fibrosis combined with ion channel/gap junction remodeling in the septum could lead to inhomogeneity of ventricular refractoriness, resulting in increased QTc dispersion in leads V1-V4.
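QTc dispersion within a lead group reduces to a simple computation: rate-correct each lead's QT interval, then take the range (max minus min) across the group. The sketch below uses the Bazett correction and made-up values for the septal/anterior leads; the abstract does not state which correction formula was used, so Bazett is an assumption.

```python
import math

def qtc_bazett(qt_ms, rr_s):
    # Bazett rate correction: QTc = QT / sqrt(RR), with RR in seconds.
    return qt_ms / math.sqrt(rr_s)

def qtc_dispersion(qtc_by_lead):
    # Dispersion within a lead group: max QTc minus min QTc.
    values = list(qtc_by_lead.values())
    return max(values) - min(values)

# Illustrative (made-up) QTc values for the septal/anterior lead group.
septal_anterior = {"V1": 420.0, "V2": 445.0, "V3": 450.0, "V4": 430.0}
dispersion_ms = qtc_dispersion(septal_anterior)
```

Because the V1-V4 group overlies the hypertrophied septum, a larger range there is what the study interprets as septal repolarization inhomogeneity.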

    N-1-methylnicotinamide is a signalling molecule produced in skeletal muscle coordinating energy metabolism

    Obesity is a major health problem, and although caloric restriction and exercise are successful strategies for losing adipose tissue in obese individuals, a simultaneous decrease in skeletal muscle mass negatively affects metabolism and muscle function. To better understand the molecular events occurring in muscle during weight loss, we measured expression changes in human skeletal muscle following a combination of severe caloric restriction and exercise over 4 days in 15 Swedish men. Key metabolic genes were regulated after the intervention, indicating a shift from carbohydrate to fat metabolism. Nicotinamide N-methyltransferase (NNMT) was the most consistently upregulated gene following the energy-deficit exercise. Circulating levels of N-1-methylnicotinamide (MNA), the product of NNMT activity, were doubled after the intervention. The fasting-fed state was an important determinant of plasma MNA levels, which peaked at ~18 h of fasting and were lowest ~3 h after a meal. In culture, MNA was secreted by isolated human myotubes and stimulated lipolysis directly, with no effect on glucagon or insulin secretion. We propose that MNA is a novel myokine that enhances the utilization of energy stores in response to low muscle energy availability. Future research should focus on applying MNA as a biomarker to identify individuals with metabolic disturbances at an early stage.

    Critical Care Health Informatics Collaborative (CCHIC): Data, tools and methods for reproducible research: A multi-centre UK intensive care database.

    OBJECTIVE: To build and curate a linkable multi-centre database of high resolution longitudinal electronic health records (EHR) from adult Intensive Care Units (ICU), and to develop a set of open-source tools to make these data 'research ready' while protecting patients' privacy, with a particular focus on anonymisation. MATERIALS AND METHODS: We developed a scalable EHR processing pipeline for extracting, linking, normalising, curating, and anonymising EHR data. Patient and public involvement was sought from the outset, and approval to hold these data was granted by the NHS Health Research Authority's Confidentiality Advisory Group (CAG). The data are held in a certified Data Safe Haven. We followed sustainable software development principles throughout, and defined and populated a common data model that links to other clinical areas. RESULTS: Longitudinal EHR data were loaded into the CCHIC database from eleven adult ICUs at 5 UK teaching hospitals. From January 2014 to January 2017, this amounted to 21,930 admissions (18,074 unique patients). Typical admissions have 70 data-items pertaining to admission and discharge, and a median of 1030 (IQR 481-2335) time-varying measures. Training datasets were made available through virtual machine images emulating the data processing environment. An open source R package, cleanEHR, was developed and released that transforms the data into a square table readily analysable by most statistical packages. A simple, language-agnostic configuration file allows the user to select and clean variables, and impute missing data. An audit trail makes clear the provenance of the data at all times. DISCUSSION: Making health care data available for research is problematic. CCHIC is a unique multi-centre, longitudinal, and linkable resource that prioritises patient privacy through the highest standards of data security, but also provides tools to clean, organise, and anonymise the data.
We believe the development of such tools is essential if we are to meet the twin requirements of respecting patient privacy and working for patient benefit. CONCLUSION: The CCHIC database is now in use by health care researchers from academia and industry. The 'research ready' suite of data preparation tools has facilitated access, and linkage to national databases of secondary care is underway.
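The idea of a simple, language-agnostic configuration driving variable selection, range cleaning, and imputation can be sketched as follows. This is a hypothetical Python illustration of the concept only, not cleanEHR's actual configuration format or API (cleanEHR itself is an R package).

```python
# Hypothetical select/clean/impute specification, mirroring the idea of a
# simple language-agnostic configuration file (not cleanEHR's real format).
CONFIG = {
    "heart_rate": {"range": (20, 250), "impute": "ffill"},
    "spo2": {"range": (50, 100), "impute": "ffill"},
}

def clean_series(values, spec):
    # Reject out-of-range or missing values; forward-fill from the last
    # valid observation when the config asks for it.
    lo, hi = spec["range"]
    cleaned, last_valid = [], None
    for v in values:
        if v is not None and lo <= v <= hi:
            last_valid = v
            cleaned.append(v)
        elif spec["impute"] == "ffill":
            cleaned.append(last_valid)
        else:
            cleaned.append(None)
    return cleaned

hr = clean_series([80, 300, 82, None], CONFIG["heart_rate"])
```

Keeping the rules in a declarative config, as the pipeline described above does, means the same cleaning specification can be audited and reused independently of the code that applies it.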